Definitions
from the GNU version of the Collaborative International Dictionary of English.
- noun (Statistics) A random process (Markov process) in which the probabilities of discrete states in a series depend only on the properties of the immediately preceding state or the next preceding state, independent of the path by which the preceding state was reached. It differs from the more general Markov process in that the states of a Markov chain are discrete rather than continuous. Certain physical processes, such as diffusion of a molecule in a fluid, are modelled as a Markov chain (see the sketch following these definitions). See also random walk.
from Wiktionary, Creative Commons Attribution/Share-Alike License.
- noun (probability theory) A discrete-time stochastic process with the Markov property.
from WordNet 3.0 Copyright 2006 by Princeton University. All rights reserved.
- noun a Markov process for which the parameter is discrete time values
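The random walk referenced in the first definition is one of the simplest Markov chains: the states are integer positions on a line, and each step moves the walker up or down by one. A minimal sketch in Python, with the step probability p chosen purely for illustration:

```python
import random

def random_walk(steps: int, p: float = 0.5, start: int = 0) -> list[int]:
    """Simulate a one-dimensional random walk for `steps` transitions.

    At each step the walker moves +1 with probability p, else -1.
    The distribution of the next position depends only on the current
    position, not on the path taken to reach it: the Markov property.
    """
    path = [start]
    for _ in range(steps):
        move = 1 if random.random() < p else -1
        path.append(path[-1] + move)
    return path

if __name__ == "__main__":
    print(random_walk(10))  # e.g. [0, 1, 0, -1, 0, 1, 2, 1, 2, 3, 2]
```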
Etymologies
Sorry, no etymologies found.
Examples
Sorry, no example sentences found.
ruzuzu commented on the word Markov chain
"A Markov chain (discrete-time Markov chain or DTMC), named after Andrey Markov, is a random process that undergoes transitions from one state to another on a state space. It must possess a property that is usually characterized as "memorylessness": the probability distribution of the next state depends only on the current state and not on the sequence of events that preceded it. This specific kind of "memorylessness" is called the Markov property. Markov chains have many applications as statistical models of real-world processes."
-- https://en.wikipedia.org/w/index.php?title=Markov_chain&oldid=693268836
December 3, 2015
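The quoted passage describes a discrete-time Markov chain in terms of a state space and memoryless transitions. A minimal sketch of such a chain, using a hypothetical two-state weather model (the states and probabilities are invented for illustration):

```python
import random

# TRANSITIONS[s][t] = probability of moving from state s to state t;
# each row sums to 1, forming the chain's transition matrix.
TRANSITIONS = {
    "sunny": {"sunny": 0.9, "rainy": 0.1},
    "rainy": {"sunny": 0.5, "rainy": 0.5},
}

def step(state: str) -> str:
    """Draw the next state. The distribution depends only on `state`,
    never on the sequence of states that preceded it (memorylessness)."""
    targets = list(TRANSITIONS[state])
    weights = list(TRANSITIONS[state].values())
    return random.choices(targets, weights=weights)[0]

def simulate(start: str, n: int) -> list[str]:
    """Run the chain for n transitions starting from `start`."""
    chain = [start]
    for _ in range(n):
        chain.append(step(chain[-1]))
    return chain

if __name__ == "__main__":
    print(simulate("sunny", 7))
```

Note that `step` takes only the current state as input; that restriction is exactly the Markov property the quotation defines.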